Linear Versus Quadratic Multivariate Classification.
Abstract
A linear classification rule (used with equal covariance matrices) was contrasted with a quadratic rule (used with unequal covariance matrices) for accuracy of internal and external classification. The comparisons were made for seven situations which resulted from combining conditions (equal and unequal covariance matrices, and two and three criterion groups) for different sets of real data. For the internal analysis the quadratic rule was superior to the linear rule in all seven situations. For the external analysis the linear rule was nearly as good as or superior to the quadratic rule in all seven situations.
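The internal/external contrast in the abstract corresponds to resubstitution accuracy versus held-out accuracy. As a minimal sketch (assuming scikit-learn and a synthetic two-group dataset, not the authors' real data), the linear rule (pooled covariance) and the quadratic rule (separate covariances) can be compared as follows.

```python
# Sketch: compare a linear rule (LDA, pooled covariance) with a quadratic rule
# (QDA, separate covariances) on internal (resubstitution) and external
# (held-out) classification accuracy. Synthetic data; illustrative only.
import numpy as np
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis)
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Two criterion groups with unequal covariance matrices.
g1 = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 1.0]], size=200)
g2 = rng.multivariate_normal([1.5, 1.0], [[2.0, -0.5], [-0.5, 0.5]], size=200)
X = np.vstack([g1, g2])
y = np.repeat([0, 1], 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

for name, rule in [("linear", LinearDiscriminantAnalysis()),
                   ("quadratic", QuadraticDiscriminantAnalysis())]:
    rule.fit(X_tr, y_tr)
    internal = rule.score(X_tr, y_tr)   # accuracy on the fitting sample
    external = rule.score(X_te, y_te)   # accuracy on held-out cases
    print(f"{name:9s}  internal={internal:.3f}  external={external:.3f}")
```

Because the quadratic rule estimates a separate covariance matrix per group, it fits more parameters and tends to look better on the fitting sample while generalizing no better, and sometimes worse, to new cases, which mirrors the pattern reported in the abstract.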
Similar resources
Classification via kernel product estimators
Multivariate kernel density estimation is often used as the basis for a nonparametric classification technique. However, the multivariate kernel classifier suffers from the curse of dimensionality, requiring inordinately large sample sizes to achieve a reasonable degree of accuracy in high dimensional settings. A variance stabilising approach to kernel classification can be motivated through an...
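As a minimal sketch of kernel-based classification (assuming SciPy's gaussian_kde and synthetic data; the product-kernel and variance-stabilising refinements of the cited paper are not reproduced), each class density is estimated separately and a new point is assigned to the class with the larger prior-weighted density.

```python
# Sketch: nonparametric classification via class-conditional kernel density
# estimates and Bayes' rule. Synthetic data; illustrative only.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
X0 = rng.normal([0.0, 0.0], 1.0, size=(150, 2))   # class 0 training sample
X1 = rng.normal([2.0, 1.0], 1.0, size=(150, 2))   # class 1 training sample

kde0 = gaussian_kde(X0.T)            # gaussian_kde expects shape (d, n)
kde1 = gaussian_kde(X1.T)
prior0 = prior1 = 0.5

def classify(points):
    """Assign each row of `points` to the class with the larger prior * density."""
    p = np.atleast_2d(points).T       # shape (d, m) for evaluation
    return (prior1 * kde1(p) > prior0 * kde0(p)).astype(int)

print(classify([[0.2, -0.1], [2.3, 1.4]]))   # expected: [0 1]
```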
Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil
In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in the Dynamic Linear Model, using the real price of oil for 106 years of data from 1913 to 2018, with attention to the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling Model under the quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
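To fix ideas about the asymmetry, the LINEX loss L(θ, d) = exp(a(d − θ)) − a(d − θ) − 1 penalizes over- and under-estimation differently; for a normal posterior N(μ, σ²) the Bayes estimate is μ − aσ²/2, rather than the posterior mean that quadratic loss gives. A minimal numeric sketch with assumed values (not the oil-price analysis of the paper):

```python
# Sketch: Bayes estimates under quadratic vs LINEX loss for a normal posterior.
# For theta ~ N(mu, sigma^2):
#   quadratic loss -> posterior mean mu
#   LINEX loss     -> -(1/a) * log E[exp(-a*theta)] = mu - a * sigma**2 / 2
import numpy as np

mu, sigma, a = 50.0, 4.0, 0.5        # assumed posterior mean, sd, asymmetry parameter

d_quadratic = mu
d_linex = mu - a * sigma**2 / 2

def linex_loss(theta, d, a):
    """LINEX loss: exp(a*(d - theta)) - a*(d - theta) - 1."""
    e = d - theta
    return np.exp(a * e) - a * e - 1

# Monte Carlo check that the LINEX estimate has the lower expected LINEX loss.
theta = np.random.default_rng(2).normal(mu, sigma, size=200_000)
print("quadratic-loss estimate:", d_quadratic,
      " expected LINEX loss:", linex_loss(theta, d_quadratic, a).mean())
print("LINEX-loss estimate:   ", d_linex,
      " expected LINEX loss:", linex_loss(theta, d_linex, a).mean())
```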
Solving and Interpreting Binary Classification Problems in Marketing with SVMs
Marketing problems often involve binary classification of customers into "buyers" versus "non-buyers" or "prefers brand A" versus "prefers brand B". These cases require binary classification models such as logistic regression and linear and quadratic discriminant analysis. A promising recent technique for the binary classification problem is the Support Vector Machine (Vapnik (1995)), which has a...
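As a minimal sketch of the binary buyer/non-buyer setup (assuming scikit-learn and synthetic customer features, not a marketing data set from the paper), a support vector machine can be compared with logistic regression and linear discriminant analysis under cross-validation.

```python
# Sketch: binary "buyer" vs "non-buyer" classification with an SVM and two
# classical linear rules. Synthetic data; illustrative only.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for customer features (e.g., recency, frequency, spend).
X, y = make_classification(n_samples=500, n_features=8, n_informative=4,
                           random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "LDA": LinearDiscriminantAnalysis(),
    "SVM (RBF kernel)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:20s} cross-validated accuracy: {acc:.3f}")
```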
An Exact Test for Multiple Inequality and Equality Constraints in the Linear Regression Model
In this article we consider the linear regression model y = Xβ + ε, where ε is N(0, σ²I). In this context we derive exact tests of the form H: Rβ ≥ r versus K: β ∈ ℝᴷ for the case in which σ² is unknown. We extend these results to consider hypothesis tests of the form H: R₁β ≥ r₁ and R₂β = r₂ versus K: β ∈ ℝᴷ. For each of these hypothesis tests we derive several equivalent forms of the test ...
Linear and Nonlinear Multivariate Classification of Iranian Bottled Mineral Waters According to Their Elemental Content Determined by ICP-OES
The combinations of inductively coupled plasma-optical emission spectrometry (ICP-OES) and three classification algorithms, i.e., partial least squares discriminant analysis (PLS-DA), least squares support vector machine (LS-SVM) and soft independent modeling of class analogies (SIMCA), for discriminating different brands of Iranian bottled mineral waters, were explored. ICP-OES was used for th...
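As a minimal sketch of one of the listed chemometric classifiers, PLS-DA can be emulated by regressing one-hot class labels on element concentrations with partial least squares and assigning each sample to the class with the largest predicted response (assuming scikit-learn and synthetic concentration data, not the ICP-OES measurements of the paper).

```python
# Sketch: PLS-DA as PLS regression on one-hot class labels, predicting the
# class with the largest fitted response. Synthetic data; illustrative only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_per_brand, n_elements, n_brands = 40, 10, 3
# Synthetic "element concentration" profiles for three water brands.
X = np.vstack([rng.normal(loc=mean, scale=1.0, size=(n_per_brand, n_elements))
               for mean in (0.0, 1.0, 2.0)])
y = np.repeat(np.arange(n_brands), n_per_brand)
Y = np.eye(n_brands)[y]                       # one-hot coded class membership

X_tr, X_te, Y_tr, Y_te, y_tr, y_te = train_test_split(X, Y, y, random_state=0)

pls = PLSRegression(n_components=2).fit(X_tr, Y_tr)
y_pred = pls.predict(X_te).argmax(axis=1)     # class with largest predicted response
print("hold-out accuracy:", (y_pred == y_te).mean())
```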
Journal: Multivariate Behavioral Research
Volume 13, Issue 2
Pages: -
Publication year: 1978